Bitcoin Stock Prediction Using Deep Learning and Sentiment Analysis


This notebook analyzes historical Bitcoin prices and explores how we can predict future Bitcoin closing prices.

Introduction

What is a cryptocurrency, and why Bitcoin?

According to Wikipedia, a cryptocurrency is a digital asset designed to work as a medium of exchange that uses cryptography to secure transactions and to control the creation of additional units of the currency through mining.

While the best-known example of a cryptocurrency is Bitcoin, there are more than 100 other tradable cryptocurrencies, called altcoins (meaning alternatives to Bitcoin), competing with each other and with Bitcoin.

The motive behind this competition is that Bitcoin has a number of design flaws, and people are trying to invent new coins that overcome these defects, hoping their inventions will eventually replace Bitcoin.

As of June 2017, the total market capitalization of all cryptocurrencies was 102 billion USD, 41 billion of which belonged to Bitcoin. Therefore, regardless of its design faults, Bitcoin is still the dominant cryptocurrency in the markets. As a result, many altcoins cannot be bought with fiat currencies but can only be traded against Bitcoin.

Hence, I chose Bitcoin as my commodity in order to make wiser future investments for my cryptocurrency portfolio.

The ubiquity of Internet access has triggered the emergence of currencies distinct from those used in the prevalent monetary system. The advent of cryptocurrencies based on a unique method called “mining” has brought about significant changes in the online economic activities of users.

Cryptocurrencies are primarily characterized by fluctuations in their price and number of transactions [1][2]. Although Bitcoin was first introduced in 2008 [2][3], it witnessed no significant fluctuation in its price and number of transactions until the end of 2013 [2], when it began to garner worldwide attention and saw a significant rise and fluctuation in both. Such unstable fluctuations have served as an opportunity for speculation for some users while hindering most others from using cryptocurrencies [1][4][5].

Methods

My research will follow a comparative approach to find an optimal technique for cryptocurrency market prediction. My first framework is a Recurrent Neural Network trained on three popular stock market indicators and past prices as key data points.

My second framework is a sequential model trained on the sentiment of public news history and past prices as key data points. It consists of a single Long Short-Term Memory (LSTM) layer that generates a prediction vector for the whole input sequence, followed by one linear Dense layer that aggregates the output into a single value.
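
For concreteness, a minimal sketch of what that architecture could look like in Keras is shown below. Keras itself and every size in the sketch (window length, feature count, LSTM units) are my illustrative assumptions, not a tuned configuration:

# A minimal sketch of the second framework, assuming Keras is available.
# The window length, feature count and unit count are placeholders.
from keras.models import Sequential
from keras.layers import LSTM, Dense

window_size = 3   # days of history per sample (assumption)
num_features = 2  # e.g. daily headline sentiment + past closing price (assumption)

model = Sequential()
# A single LSTM layer summarizes the whole input sequence into one vector
model.add(LSTM(12, input_shape=(window_size, num_features)))
# One linear Dense layer aggregates that vector into a single predicted price
model.add(Dense(1, activation='linear'))
model.compile(loss='mean_squared_error', optimizer='adam')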

Comparison will be made on the basis of performance. Both techniques have advantages and disadvantages; my research will analyze the advantages and limitations of each to determine which is comparatively better for Bitcoin market prediction specifically.

Background

In a traditional recurrent neural network, during the gradient back-propagation phase, the gradient signal can end up being multiplied a large number of times (as many as the number of time steps) by the weight matrix associated with the connections between the neurons of the recurrent hidden layer. This means that the magnitude of the weights in the transition matrix can have a strong impact on the learning process.

If the weights in this matrix are small (or, more formally, if the leading eigenvalue of the weight matrix is smaller than 1.0), it can lead to a situation called vanishing gradients, where the gradient signal gets so small that learning either becomes very slow or stops working altogether. It also makes the task of learning long-term dependencies in the data more difficult. Conversely, if the weights in this matrix are large (or, again, more formally, if the leading eigenvalue of the weight matrix is larger than 1.0), the gradient signal can become so large that it causes learning to diverge. This is often referred to as exploding gradients.
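
A toy numpy sketch of both regimes (the matrices and step count here are arbitrary, chosen only to make the eigenvalue effect visible):

import numpy as np

# During back-propagation through time, the gradient signal is multiplied
# by the recurrent weight matrix once per time step.
def propagate(W, steps=50):
    g = np.ones(W.shape[0])
    for _ in range(steps):
        g = W.T @ g
    return np.linalg.norm(g)

W_small = 0.5 * np.eye(3)  # leading eigenvalue 0.5 < 1.0
W_large = 1.5 * np.eye(3)  # leading eigenvalue 1.5 > 1.0

print(propagate(W_small))  # ~1.5e-15: the gradient vanishes
print(propagate(W_large))  # ~1.1e+09: the gradient explodes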

These issues are the main motivation behind the LSTM model which introduces a new structure called a memory cell. A memory cell is composed of four main elements: an input gate, a neuron with a self-recurrent connection (a connection to itself), a forget gate and an output gate. The self-recurrent connection has a weight of 1.0 and ensures that, barring any outside interference, the state of a memory cell can remain constant from one time step to another.

[Figure: an LSTM memory cell with its input, forget, and output gates]

The gates serve to modulate the interactions between the memory cell itself and its environment. The input gate can allow incoming signal to alter the state of the memory cell or block it. On the other hand, the output gate can allow the state of the memory cell to have an effect on other neurons or prevent it. Finally, the forget gate can modulate the memory cell’s self-recurrent connection, allowing the cell to remember or forget its previous state, as needed.
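
In code form, one LSTM time step looks roughly like the following schematic numpy sketch; the gate names and sigmoid/tanh choices follow the standard LSTM formulation, with biases omitted for brevity:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Schematic single LSTM time step. W is a dict of gate weight matrices,
# each of shape [hidden_size, input_size + hidden_size] (an illustrative
# assumption, not this notebook's actual model).
def lstm_step(x, h_prev, c_prev, W):
    z = np.concatenate([x, h_prev])
    i = sigmoid(W['i'] @ z)        # input gate: admit or block incoming signal
    f = sigmoid(W['f'] @ z)        # forget gate: keep or clear the old state
    o = sigmoid(W['o'] @ z)        # output gate: expose or hide the state
    c_tilde = np.tanh(W['c'] @ z)  # candidate update to the cell state
    c = f * c_prev + i * c_tilde   # self-recurrent connection with weight 1.0
    h = o * np.tanh(c)             # what the rest of the network sees
    return h, c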

I believe the biggest difference between NLP and financial analysis is that language comes with some guarantee of structure; it is just that the rules of that structure are vague. Markets, on the other hand, come with no promise of a learnable structure. That such a structure exists is the assumption this project would prove or disprove (rather, it might prove or disprove whether I can find that structure).

Assuming that such a structure exists, the idea of summarizing the current state of the market in the same way we encode the semantics of a paragraph seems plausible to me.

A Data-Driven Approach To Cryptocurrency Speculation

How do Bitcoin markets behave? What are the causes of the sudden spikes and dips in cryptocurrency values? How can we predict what will happen next?

Research on the attributes of cryptocurrencies has made steady progress but has a long way to go. Most researchers analyze user sentiment related to cryptocurrencies on social media (e.g., Twitter) or quantified web search queries on search engines such as Google, alongside fluctuations in price and trade volume, to determine any relation [6–10]. Past studies have been limited to Bitcoin because the large amount of data it provides eliminates the need to build a model to predict fluctuations in the price and number of transactions of diverse cryptocurrencies.

Articles on cryptocurrencies, such as Bitcoin, are rife with speculation these days, with hundreds of self-proclaimed experts advocating for the trends that they expect to emerge. What is lacking from many of these analyses is a strong data analysis foundation to back up the claims.

So I felt that analyzing the top headlines on the first page of the Google News results for the term Bitcoin, in order to predict its closing price for the next day, was the most unbiased approach to resolving the biased opinions strewn around the web. I also felt that excluding the "Price" suffix was justified: including it led Google News to return those biased articles rather than news revolving around Bitcoin in general, which I feel is more relevant for an unbiased prediction of closing prices.
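
As a rough sketch of how such headlines could be collected programmatically, the snippet below assumes Google News' RSS search endpoint (an assumption on my part; my actual headlines came from the first page of the regular Google News results):

# Hedged sketch: fetch Bitcoin headlines from the Google News RSS feed.
# The endpoint URL is an assumption and may change or be rate-limited.
import requests
import xml.etree.ElementTree as ET

def fetch_bitcoin_headlines(query='bitcoin'):
    url = 'https://news.google.com/rss/search?q={}'.format(query)
    xml_bytes = requests.get(url, timeout=10).content
    root = ET.fromstring(xml_bytes)
    # RSS items live under channel/item; each carries a <title> child
    return [item.findtext('title') for item in root.iter('item')]

# headlines = fetch_bitcoin_headlines()  # e.g. feed these into a sentiment scorer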

This approach also resonates with my personal approach to tracking the closing prices of the cryptocurrencies I have invested in: I always find myself skimming the most important headlines on the first page of the Google News results. Extrapolating my research method to a sample of the population representing cryptocurrency investors also seems fair. Don't agree? This article on Bloomberg Markets does!

In a nutshell, the article suggests that, according to Google Trends, global searches for “buy bitcoin” have overtaken “buy gold” after previously exceeding searches for how to purchase silver.

Note - Numbers represent search interest relative to the highest point on the chart for the given region and time. A value of 100 is the peak popularity for the term. A value of 50 means that the term is half as popular. Likewise a score of 0 means the term was less than 1% as popular as the peak.

[Figure: Google Trends search interest for "buy bitcoin" vs. "buy gold"]

For the rest of the skeptics, I decided to explore Google Trends myself to evaluate the legitimacy of my claim. The global search interest for bitcoin, over the period starting from January 3, 2009, when the genesis block (the first block on the Bitcoin blockchain) kicked off the chain and included a reference to a pertinent newspaper headline of that day:

The Times 03/Jan/2009 Chancellor on brink of second bailout for banks.

looks like this:

[Figure: Google Trends search interest for "bitcoin" since January 2009]

Input Data

Initial Features Set

My initial feature set includes the Adjusted Open, Adjusted High, Adjusted Low, Adjusted Close, Adjusted Volume (BTC), Adjusted Volume (Currency), and Weighted Price for Bitcoin, retrieved using Quandl's free Bitcoin API for dates ranging from January 7, 2014 to December 12, 2017.

I used pickle to serialize and save the downloaded data as a file, which will prevent my script from re-downloading the same data each time I run the script.

In [1]:
# Define Quandl Helper Function to download and cache bitcoin dataset from Quandl
import json
import numpy as np
import pandas as pd
import pickle
import quandl
from datetime import datetime
import plotly.offline as py
import plotly.graph_objs as go
import plotly.figure_factory as ff
py.init_notebook_mode(connected=True)

with open('api_key.json') as f:
    api = json.load(f)
quandl.ApiConfig.api_key = api["api_key"]

def get_quandl_data(quandl_id):
    # Download and cache a Quandl data series
    cache_path = '{}.pkl'.format(quandl_id).replace('/','-')
    try:
        with open(cache_path, 'rb') as f:
            df = pickle.load(f)
        print('Loaded {} from cache'.format(quandl_id))
    except (OSError, IOError):
        print('Downloading {} from Quandl'.format(quandl_id))
        df = quandl.get(quandl_id, returns="pandas", start_date="2014-01-07", end_date="2017-12-12")
        df.to_pickle(cache_path)
        print('Cached {} at {}'.format(quandl_id, cache_path))
    return df

# Pull Kraken BTC exchange historical pricing data
btc_usd_price_kraken = get_quandl_data('BCHARTS/KRAKENUSD')

btc_usd_price_kraken.head()
Downloading BCHARTS/KRAKENUSD from Quandl
Cached BCHARTS/KRAKENUSD at BCHARTS-KRAKENUSD.pkl
Out[1]:
Open High Low Close Volume (BTC) Volume (Currency) Weighted Price
Date
2014-01-07 874.67040 892.06753 810.00000 810.00000 15.622378 13151.472844 841.835522
2014-01-08 810.00000 899.84281 788.00000 824.98287 19.182756 16097.329584 839.156269
2014-01-09 825.56345 870.00000 807.42084 841.86934 8.158335 6784.249982 831.572913
2014-01-10 839.99000 857.34056 817.00000 857.33056 8.024510 6780.220188 844.938794
2014-01-11 858.20000 918.05471 857.16554 899.84105 18.748285 16698.566929 890.671709
In [2]:
# Chart the BTC close pricing data
btc_trace = go.Scatter(x=btc_usd_price_kraken.index, y=btc_usd_price_kraken['Close'])
py.iplot([btc_trace])

Observations

There are a few notable down-spikes, particularly in late 2014 and early 2016. These spikes are specific to the Kraken dataset, and I obviously don't want them to be reflected in my overall pricing analysis.

The nature of Bitcoin exchanges is that the pricing is determined by supply and demand, hence no single exchange contains a true "master price" of Bitcoin. To solve this issue, along with that of the down-spikes, I pulled data from three more major Bitcoin exchanges to calculate an aggregate Bitcoin price index.

In [3]:
# Pull pricing data for 3 more BTC exchanges
exchanges = ['COINBASE','BITSTAMP','ITBIT']

exchange_data = {}

exchange_data['KRAKEN'] = btc_usd_price_kraken

for exchange in exchanges:
    exchange_code = 'BCHARTS/{}USD'.format(exchange)
    btc_exchange_df = get_quandl_data(exchange_code)
    exchange_data[exchange] = btc_exchange_df
Downloading BCHARTS/COINBASEUSD from Quandl
Cached BCHARTS/COINBASEUSD at BCHARTS-COINBASEUSD.pkl
Downloading BCHARTS/BITSTAMPUSD from Quandl
Cached BCHARTS/BITSTAMPUSD at BCHARTS-BITSTAMPUSD.pkl
Downloading BCHARTS/ITBITUSD from Quandl
Cached BCHARTS/ITBITUSD at BCHARTS-ITBITUSD.pkl
In [4]:
# Merge All Of The Pricing Data Into A Single Dataframe
def merge_dfs_on_column(dataframes, labels, col):
    # Merge a single column of each dataframe into a new combined dataframe
    series_dict = {}
    for index in range(len(dataframes)):
        series_dict[labels[index]] = dataframes[index][col]
        
    return pd.DataFrame(series_dict)

# Merge the BTC price dataseries' into a single dataframe on their "Close Price" column
btc_usd_datasets_close = merge_dfs_on_column(list(exchange_data.values()), list(exchange_data.keys()), 'Close')
btc_usd_datasets_open = merge_dfs_on_column(list(exchange_data.values()), list(exchange_data.keys()), 'Open')
btc_usd_datasets_high = merge_dfs_on_column(list(exchange_data.values()), list(exchange_data.keys()), 'High')
btc_usd_datasets_low = merge_dfs_on_column(list(exchange_data.values()), list(exchange_data.keys()), 'Low')

btc_usd_datasets_close.tail()
Out[4]:
BITSTAMP COINBASE ITBIT KRAKEN
Date
2017-12-08 15800.00 16367.03 16158.92 15921.9
2017-12-09 14607.49 15309.98 15062.92 14682.9
2017-12-10 14691.00 15290.01 15301.96 14850.3
2017-12-11 16470.00 16885.76 16907.28 16578.0
2017-12-12 16650.01 17730.12 17333.51 16889.3
In [5]:
# Visualize The Pricing Datasets

# Helper function to provide a single-line command to compare each column in the dataframe
def df_scatter(df, title, seperate_y_axis=False, y_axis_label='', scale='linear', initial_hide=False):
    #Generate a scatter plot of the entire dataframe
    label_arr = list(df)
    series_arr = list(map(lambda col: df[col], label_arr))
    
    layout = go.Layout(
        title=title,
        legend=dict(orientation="h"),
        xaxis=dict(type='date'),
        yaxis=dict(
            title=y_axis_label,
            showticklabels= not seperate_y_axis,
            type=scale
        )
    )
    
    y_axis_config = dict(
        overlaying='y',
        showticklabels=False,
        type=scale )
    
    visibility = 'visible'
    if initial_hide:
        visibility = 'legendonly'
        
    # Form Trace For Each Series
    trace_arr = []
    for index, series in enumerate(series_arr):
        trace = go.Scatter(
            x=series.index, 
            y=series, 
            name=label_arr[index],
            visible=visibility
        )
        
        # Add separate axis for the series
        if seperate_y_axis:
            trace['yaxis'] = 'y{}'.format(index + 1)
            layout['yaxis{}'.format(index + 1)] = y_axis_config    
        trace_arr.append(trace)

    fig = go.Figure(data=trace_arr, layout=layout)
    py.iplot(fig)
    
# Plot all of the BTC exchange closing prices
df_scatter(btc_usd_datasets_close, 'Bitcoin Closing Price (USD) By Exchange')

Observations

Although the four series follow roughly the same path, there are various irregularities in each that should be eliminated. Since the price of Bitcoin has never been equal to zero in the timeframe I am examining, it makes sense to remove all of the zero values from the combined dataframe.

In [6]:
# Remove "0" values
btc_usd_datasets_close.replace(0, np.nan, inplace=True)
btc_usd_datasets_open.replace(0, np.nan, inplace=True)
btc_usd_datasets_high.replace(0, np.nan, inplace=True)
btc_usd_datasets_low.replace(0, np.nan, inplace=True)

# Plot the cleaned dataframe
df_scatter(btc_usd_datasets_close, 'Bitcoin Closing Price (USD) By Exchange')
In [7]:
# Calculate the average BTC closing price as a new column
btc_usd_datasets_close['avg_btc_close_price_usd'] = btc_usd_datasets_close.mean(axis=1)
btc_usd_datasets_open['avg_btc_open_price_usd'] = btc_usd_datasets_open.mean(axis=1)
btc_usd_datasets_high['avg_btc_high_price_usd'] = btc_usd_datasets_high.mean(axis=1)
btc_usd_datasets_low['avg_btc_low_price_usd'] = btc_usd_datasets_low.mean(axis=1)

# Plot the average BTC closing price
btc_trace = go.Scatter(x=btc_usd_datasets_close.index, y=btc_usd_datasets_close['avg_btc_close_price_usd'])
py.iplot([btc_trace])
In [8]:
btc_usd_datasets_close_final = btc_usd_datasets_close['avg_btc_close_price_usd'].copy()
btc_usd_datasets_open_final = btc_usd_datasets_open['avg_btc_open_price_usd'].copy()
btc_usd_datasets_high_final = btc_usd_datasets_high['avg_btc_high_price_usd'].copy()
btc_usd_datasets_low_final = btc_usd_datasets_low['avg_btc_low_price_usd'].copy()

btc_usd_datasets_close_final = btc_usd_datasets_close_final.reset_index()
btc_usd_datasets_open_final = btc_usd_datasets_open_final.reset_index()
btc_usd_datasets_high_final = btc_usd_datasets_high_final.reset_index()
btc_usd_datasets_low_final = btc_usd_datasets_low_final.reset_index()

btc_usd_datasets_open_final.columns = ['Date','Average Open Price (USD)']
btc_usd_datasets_high_final.columns = ['Date','Average High Price (USD)']
btc_usd_datasets_low_final.columns = ['Date','Average Low Price (USD)']
btc_usd_datasets_close_final.columns = ['Date','Average Close Price (USD)']

btc_usd_datasets_close_final.head()
Out[8]:
Date Average Close Price (USD)
0 2014-01-07 795.333333
1 2014-01-08 825.656790
2 2014-01-09 838.876447
3 2014-01-10 848.229587
4 2014-01-11 897.770750
In [9]:
btc_usd_datasets_final_1 = pd.merge(btc_usd_datasets_open_final, btc_usd_datasets_high_final, on='Date')
btc_usd_datasets_final_2 = pd.merge(btc_usd_datasets_low_final, btc_usd_datasets_close_final, on='Date')                                  
btc_usd_datasets_final = pd.merge(btc_usd_datasets_final_1, btc_usd_datasets_final_2, on='Date') 
btc_usd_datasets_final.to_csv('BTC_USD.csv', index=False)
btc_usd_datasets_final.head()                                   
Out[9]:
Date Average Open Price (USD) Average High Price (USD) Average Low Price (USD) Average Close Price (USD)
0 2014-01-07 906.386867 921.252577 795.333333 795.333333
1 2014-01-08 792.000000 862.560037 774.666667 825.656790
2 2014-01-09 827.741783 856.000000 794.042547 838.876447
3 2014-01-10 832.822867 851.162920 816.462867 848.229587
4 2014-01-11 869.610000 909.429470 867.225180 897.770750

Secondary Features Set

This feature set builds out my dataframe from a csv file named BTC_USD.csv, which I generated by extending the procedure that defined my initial feature set.

The model in theory

I am going to use 4 features: the close price and three additional technical indicators.

  • MACD (Trend Indicator)
  • Stochastics (Momentum Indicator)
  • Average True Range (Volatility Indicator)

Functions

Trend Indicator

What is a trend?

Exponential Moving Average: a type of infinite impulse response filter that applies weighting factors which decrease exponentially. The weighting for each older datum decreases exponentially, never reaching zero. It is similar to a simple moving average, except that more weight is given to the latest data. It is also known as the exponentially weighted moving average. This type of moving average reacts faster to recent price changes than a simple moving average. The 12- and 26-day EMAs are the most popular short-term averages, and they are used to create indicators like the moving average convergence divergence (MACD) and the percentage price oscillator (PPO).

[Figure: exponential moving average example]
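
Concretely, the recurrence behind the pandas ewm calls used later is EMA_t = alpha * price_t + (1 - alpha) * EMA_{t-1}, with alpha = 2 / (span + 1). A toy sanity check on made-up prices:

import pandas as pd

prices = pd.Series([800.0, 810.0, 805.0, 820.0, 815.0])  # made-up prices
span = 3
alpha = 2.0 / (span + 1)

# Manual recurrence, seeded with the first price
ema = prices.iloc[0]
for p in prices.iloc[1:]:
    ema = alpha * p + (1 - alpha) * ema

# Matches pandas' recursive form (adjust=False)
print(ema, prices.ewm(span=span, adjust=False).mean().iloc[-1])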

MACD: The Moving Average Convergence/Divergence oscillator (MACD) is one of the simplest and most effective momentum indicators available. The MACD turns two trend-following indicators, moving averages, into a momentum oscillator by subtracting the longer moving average from the shorter moving average.

Momentum Indicator

What is momentum?

Stochastics oscillator: The Stochastic Oscillator is a momentum indicator that shows the location of the close relative to the high-low range over a set number of periods. It measures whether the closing price of a security is closer to the high or the low. It is based on the assumption that when a market is trending upward, the closing price will be closer to the highest price, and, when it is trending downward, the closing price will be closer to the lowest price.

Volatility Indicator

What is volatility?

Average True Range: an indicator that measures volatility (NOT price direction). The true range is the largest of:

  • Method A: Current High less the current Low
  • Method B: Current High less the previous Close (absolute value)
  • Method C: Current Low less the previous Close (absolute value)

The average true range is a moving average, generally 14 days, of the true ranges. Basically a stock experiencing a high level of volatility has a higher ATR, and a low volatility stock has a lower ATR.
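
Note that the ATR() helper defined under "Calculation" below returns the raw true-range series; the average true range proper is one more rolling mean over it, as in this one-line sketch:

# Sketch: turn a raw true-range series (the output of ATR() below)
# into the 14-day average true range described above.
def average_true_range(true_range, period=14):
    return true_range.rolling(period).mean()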

Calculation:

In [10]:
def MACD(df, period1, period2, periodSignal):
    # MACD line: fast EMA minus slow EMA of the price series
    EMA1 = df.ewm(span=period1).mean()
    EMA2 = df.ewm(span=period2).mean()
    MACD = EMA1 - EMA2

    # Signal line: an EMA of the MACD line. Note that passing periodSignal
    # positionally makes pandas treat it as `com` rather than `span`;
    # it is kept that way here to match the recorded outputs below.
    Signal = MACD.ewm(periodSignal).mean()

    # The histogram (MACD minus its signal line) is what the model consumes
    Histogram = MACD - Signal

    return Histogram

def stochastics_oscillator(df, period):
    # %K: where the close sits within the rolling high-low range, in percent
    l, h = df.rolling(period).min(), df.rolling(period).max()
    k = 100 * (df - l) / (h - l)
    return k

def ATR(df, period):
    # Method A: Current High less the current Low
    df['H-L'] = abs(df['Average High Price (USD)'] - df['Average Low Price (USD)'])
    # Method B: Current High less the previous Close (absolute value)
    df['H-PC'] = abs(df['Average High Price (USD)'] - df['Average Close Price (USD)'].shift(1))
    # Method C: Current Low less the previous Close (absolute value)
    df['L-PC'] = abs(df['Average Low Price (USD)'] - df['Average Close Price (USD)'].shift(1))
    # Largest of the three methods per day: the raw true range. The `period`
    # argument would feed the rolling mean that yields the ATR proper.
    TR = df[['H-L', 'H-PC', 'L-PC']].max(axis=1)
    return TR.to_frame()

Read Data

In [11]:
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
pd.options.mode.chained_assignment = None

df = pd.read_csv('BTC_USD.csv',usecols=[1,2,3,4])

dfPrices = pd.read_csv('BTC_USD.csv',usecols=[4])
In [12]:
dfPrices.head(2)
Out[12]:
Average Close Price (USD)
0 795.333333
1 825.656790

Plot Data

Plotting Price

In [13]:
# Last 60 days of average closing prices as a flat numpy array
prices = dfPrices.iloc[-60:].values.ravel()
In [14]:
plt.figure(figsize=(25,7))
plt.plot(prices,label='Test',color='orange')
plt.title('Price')
plt.legend(loc='upper left')
plt.show()

Plotting MACD

In [15]:
macd = MACD(dfPrices.iloc[-60:],12,26,9)
In [16]:
plt.figure(figsize=(25,7))
plt.plot(macd,label='macd',color='blue')
plt.title('MACD')
plt.legend(loc='upper left')
plt.show()

Plotting Stochastics Oscillator

In [17]:
stochastics = stochastics_oscillator(dfPrices.iloc[-60:],14)
In [18]:
plt.figure(figsize=(14,7))
# All 60 points in this window are plotted; the [0:100] slice is a no-op here
plt.plot(stochastics[0:100],label='Stochastics Oscillator',color='red')
plt.title('Stochastics Oscillator')
plt.legend(loc='upper left')
plt.show()

Plotting Average True Range

In [19]:
atr = ATR(df.iloc[-60:],14)
In [20]:
plt.figure(figsize=(21,7))
# All 60 points in this window are plotted; the [0:100] slice is a no-op here
plt.plot(atr[0:100],label='ATR',color='green')
plt.title('Average True Range')
plt.legend(loc='upper left')
plt.show()

Create Final Dataframe and Save Data

In [21]:
dfPriceShift = dfPrices.shift(-1)
dfPriceShift.rename(columns={'Average Close Price (USD)':'Average Close Price Target (USD)'}, inplace=True)
dfPriceShift.head(2)
Out[21]:
Average Close Price Target (USD)
0 825.656790
1 838.876447
In [22]:
macd = MACD(dfPrices,12,26,9)
macd.rename(columns={'Average Close Price (USD)':'MACD'}, inplace=True)
In [23]:
stochastics = stochastics_oscillator(dfPrices,14)
stochastics.rename(columns={'Average Close Price (USD)':'Stochastics'}, inplace=True)
In [24]:
atr = ATR(df,14)
atr.rename(columns={0:'ATR'}, inplace=True)
In [25]:
final_data = pd.concat([dfPrices,dfPriceShift,macd,stochastics,atr], axis=1)
# Delete the entries with missing values (where the stochastics couldn't be computed yet) 
final_data = final_data.dropna()
In [26]:
final_data.info()
<class 'pandas.core.frame.DataFrame'>
Int64Index: 1422 entries, 13 to 1434
Data columns (total 5 columns):
Average Close Price (USD)           1422 non-null float64
Average Close Price Target (USD)    1422 non-null float64
MACD                                1422 non-null float64
Stochastics                         1422 non-null float64
ATR                                 1422 non-null float64
dtypes: float64(5)
memory usage: 66.7 KB
In [27]:
final_data
Out[27]:
Average Close Price (USD) Average Close Price Target (USD) MACD Stochastics ATR
13 831.045463 821.300000 -1.712121 34.862388 30.227933
14 821.300000 819.029997 -1.690852 24.314425 25.633297
15 819.029997 819.699457 -1.725941 22.067729 12.507147
16 819.699457 788.295283 -1.639052 22.730315 15.930397
17 788.295283 809.606667 -3.255442 0.000000 45.300023
18 809.606667 817.040530 -3.011648 28.132088 33.029407
19 817.040530 764.078500 -2.288495 51.953532 52.968600
20 764.078500 799.663333 -4.799199 0.000000 70.647533
21 799.663333 796.403333 -4.209485 44.735167 56.651800
22 796.403333 803.979233 -3.819920 45.139165 23.539337
23 803.979233 805.063457 -2.915930 55.718332 29.512983
24 805.063457 817.626600 -2.079047 57.232367 14.036330
25 817.626600 814.927155 -0.593037 74.775839 23.032430
26 814.927155 812.760000 0.338034 75.930955 17.557845
27 812.760000 806.130000 0.884600 85.075540 14.365930
28 806.130000 791.943333 0.855184 75.603698 13.771867
29 791.943333 767.717347 -0.071531 50.097724 21.833113
30 767.717347 713.044020 -2.260588 6.795473 37.372763
31 713.044020 689.930850 -7.280388 0.000000 113.600550
32 689.930850 695.456667 -12.040765 0.000000 45.809533
33 695.456667 694.777200 -14.548913 4.327330 59.288973
34 694.777200 682.844437 -15.795274 3.795232 159.333333
35 682.844437 669.218633 -16.858266 0.000000 85.801267
36 669.218633 620.770473 -17.858036 0.000000 51.119930
37 620.770473 673.779880 -21.137115 0.000000 61.588600
38 673.779880 656.165567 -18.964076 26.927994 152.420000
39 656.165567 617.910370 -17.840244 18.230170 31.860917
40 617.910370 645.203867 -18.895788 0.000000 63.604953
41 645.203867 630.740033 -16.908996 14.500877 59.407630
42 630.740033 627.860597 -15.713786 7.371973 35.839127
... ... ... ... ... ...
1405 5873.277500 6528.225000 -139.020185 0.000000 976.020000
1406 6528.225000 6619.692500 -153.286768 41.154653 994.770000
1407 6619.692500 7285.140000 -156.221843 46.902157 268.780000
1408 7285.140000 7819.490000 -109.686248 88.716595 716.822500
1409 7819.490000 7693.945000 -39.039513 100.000000 819.175000
1410 7693.945000 7778.510000 -0.978175 93.549266 457.812500
1411 7778.510000 8022.372500 27.977721 97.894372 394.295000
1412 8022.372500 8237.052500 60.994026 100.000000 390.737500
1413 8237.052500 8103.910000 93.655265 100.000000 347.510000
1414 8103.910000 8237.372500 99.945353 94.367378 570.640000
1415 8237.372500 8015.040000 106.466664 100.000000 240.382500
1416 8015.040000 8198.137500 87.277343 90.595450 266.140000
1417 8198.137500 8747.122500 79.579373 98.340380 442.822500
1418 8747.122500 9295.180000 106.408069 100.000000 609.452500
1419 9295.180000 9702.945000 156.538566 100.000000 772.857500
1420 9702.945000 9888.512500 211.249541 100.000000 444.137500
1421 9888.512500 9844.125000 251.251196 100.000000 368.540000
1422 9844.125000 9970.282500 262.808573 97.977392 2420.832500
1423 9970.282500 10870.280000 265.885063 100.000000 1673.732500
1424 10870.280000 10900.797500 318.138015 100.000000 1521.395000
1425 10900.797500 11237.525000 340.612591 100.000000 485.850000
1426 11237.525000 11630.035000 362.968602 100.000000 1294.332500
1427 11630.035000 11706.257500 388.118912 100.000000 721.005000
1428 11706.257500 13775.795000 391.286924 100.000000 390.605000
1429 13775.795000 16667.930000 521.517394 100.000000 2232.017500
1430 16667.930000 16061.962500 795.651645 100.000000 3896.180000
1431 16061.962500 14915.822500 910.456660 92.349675 3318.607500
1432 14915.822500 15033.317500 869.431672 76.235360 3111.735000
1433 15033.317500 16710.260000 809.567318 76.530998 2702.245000
1434 16710.260000 17150.735000 849.366749 100.000000 2408.890000

1422 rows × 5 columns

In [28]:
final_data.to_csv('BTC_USD_TechnicalIndicators.csv',index=False)

First Framework - Recurrent Neural Network trained on Initial Features (Historical Bitcoin Prices) and Secondary Features (Fiat Currency Stock Market Technical Indicators)

In [29]:
import pandas as pd 
import numpy as np
import matplotlib.pyplot as plt
import tensorflow as tf
%matplotlib inline
/Users/Arjun/anaconda/lib/python3.6/site-packages/h5py/__init__.py:34: FutureWarning:

Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.

Read the data

In [30]:
df = pd.read_csv('BTC_USD_TechnicalIndicators.csv')
df.head(2)
Out[30]:
Average Close Price (USD) Average Close Price Target (USD) MACD Stochastics ATR
0 831.045463 821.300000 -1.712121 34.862388 30.227933
1 821.300000 819.029997 -1.690852 24.314425 25.633297

Normalize the data

In [31]:
dfNorm = (df - df.mean()) / (df.max() - df.min())
dfNorm.head()
Out[31]:
Average Close Price (USD) Average Close Price Target (USD) MACD Stochastics ATR
0 -0.018616 -0.019383 -0.008789 -0.220788 -0.012736
1 -0.019205 -0.019517 -0.008770 -0.326268 -0.013916
2 -0.019342 -0.019477 -0.008801 -0.348734 -0.017287
3 -0.019302 -0.021327 -0.008725 -0.342109 -0.016408
4 -0.021201 -0.020072 -0.010136 -0.569412 -0.008866

Defining the Hyperparameters

In [32]:
num_epochs = 500
total_series_length = len(df.index)
# Sequence Size
truncated_backprop_length = 3
# Number of neurons
state_size = 12 
num_classes = 1
num_features = 4
batch_size = 1
num_batches = total_series_length//batch_size//truncated_backprop_length
min_test_size = 100

print('The total length of the series is: {}'.format(total_series_length))
print('The current configuration gives us {} batches of {} observation each, where each one is looking {} steps in the past'.format(num_batches,batch_size,truncated_backprop_length))
The total length of the series is: 1422
The current configuration gives us 474 batches of 1 observation each, where each one is looking 3 steps in the past

Splitting the data into Training and Testing sets

In [33]:
dfTrain = dfNorm[df.index < num_batches*batch_size*truncated_backprop_length]


for i in range(min_test_size,len(dfNorm.index)):
    
    if(i % (truncated_backprop_length*batch_size) == 0):
        test_first_idx = len(dfNorm.index)-i
        break

dfTest =  dfNorm[df.index >= test_first_idx]
In [34]:
dfTrain.head()
Out[34]:
Average Close Price (USD) Average Close Price Target (USD) MACD Stochastics ATR
0 -0.018616 -0.019383 -0.008789 -0.220788 -0.012736
1 -0.019205 -0.019517 -0.008770 -0.326268 -0.013916
2 -0.019342 -0.019477 -0.008801 -0.348734 -0.017287
3 -0.019302 -0.021327 -0.008725 -0.342109 -0.016408
4 -0.021201 -0.020072 -0.010136 -0.569412 -0.008866
In [35]:
dfTest.head()
Out[35]:
Average Close Price (USD) Average Close Price Target (USD) MACD Stochastics ATR
1320 0.229252 0.204506 0.021086 0.430588 0.041945
1321 0.210649 0.204507 0.007767 0.100816 0.101170
1322 0.210649 0.188199 -0.005007 0.100827 0.058924
1323 0.193907 0.193255 -0.034349 -0.254283 0.103168
1324 0.199098 0.204309 -0.051670 -0.199283 0.087348
In [36]:
xTrain = dfTrain[['Average Close Price (USD)','MACD','Stochastics','ATR']].values
yTrain = dfTrain['Average Close Price Target (USD)'].values
In [37]:
print(xTrain[0:3],'\n',yTrain[0:3])
[[-0.01861575 -0.00878864 -0.2207879  -0.01273624]
 [-0.01920518 -0.00877007 -0.32626754 -0.01391607]
 [-0.01934248 -0.00880071 -0.34873449 -0.01728667]] 
 [-0.01938293 -0.01951667 -0.01947723]
In [38]:
xTest = dfTest[['Average Close Price (USD)','MACD','Stochastics','ATR']].values
yTest = dfTest['Average Close Price Target (USD)'].values
In [39]:
print(xTest[0:3],'\n',yTest[0:3])
[[ 0.22925166  0.02108565  0.43058822  0.04194472]
 [ 0.21064879  0.00776725  0.10081591  0.10117024]
 [ 0.21064939 -0.00500709  0.10082663  0.05892424]] 
 [0.2045064  0.20450699 0.18819902]

Visualizing the starting average close price data

In [40]:
start_avg_cp_train_trace = go.Scatter(y=xTrain[:,0])
layout = dict(title = 'Train Data (' + str(len(xTrain)) + ' data points)')
fig = dict(data=[start_avg_cp_train_trace], layout=layout)
py.iplot(fig)
In [41]:
start_avg_cp_test_trace = go.Scatter(y=xTest[:,0])
layout = dict(title = 'Test Data (' + str(len(xTest)) + ' data points)')
fig = dict(data=[start_avg_cp_test_trace], layout=layout)
py.iplot(fig)

Placeholders

In [42]:
batchX_placeholder = tf.placeholder(dtype=tf.float32,shape=[None,truncated_backprop_length,num_features],name='data_ph')
batchY_placeholder = tf.placeholder(dtype=tf.float32,shape=[None,truncated_backprop_length,num_classes],name='target_ph')

Weights and Biases

Since I have considered a 3-layer neural network comprising:

  1. Input Layer
  2. Hidden Recurrent Layer
  3. Output Layer

and the output is the result of a linear activation applied to the last layer of the RNN, we need only a single weight and bias pair.

In [43]:
weight = tf.Variable(tf.truncated_normal([state_size,num_classes]))
bias = tf.Variable(tf.constant(0.1,shape=[num_classes]))
# Unpack
labels_series = tf.unstack(batchY_placeholder, axis=1)

Forward Pass (Unroll the cell)

Input to RNN

In [44]:
cell = tf.contrib.rnn.BasicRNNCell(num_units=state_size)

states_series, current_state = tf.nn.dynamic_rnn(cell=cell,inputs=batchX_placeholder,dtype=tf.float32)

states_series = tf.transpose(states_series,[1,0,2])

Backward Pass (Output)

In [45]:
last_state = tf.gather(params=states_series,indices=states_series.get_shape()[0]-1)
last_label = tf.gather(params=labels_series,indices=len(labels_series)-1)

Prediction, Loss and Optimizer

In [46]:
prediction = tf.matmul(last_state,weight) + bias
prediction
Out[46]:
<tf.Tensor 'add:0' shape=(?, 1) dtype=float32>
In [47]:
mse_loss = tf.reduce_mean(tf.squared_difference(last_label,prediction))
train_step = tf.train.AdamOptimizer(learning_rate=0.001).minimize(mse_loss)
/Users/Arjun/anaconda/lib/python3.6/site-packages/tensorflow/python/ops/gradients_impl.py:96: UserWarning:

Converting sparse IndexedSlices to a dense Tensor of unknown shape. This may consume a large amount of memory.

In [48]:
train_mse_loss_list = []
test_mse_loss_list = []
test_pred_list = []

with tf.Session() as sess:
    
    tf.global_variables_initializer().run()
    
    for epoch_idx in range(num_epochs):
                
        print('Epoch {}'.format(epoch_idx))
        
        for batch_idx in range(num_batches):
            start_idx = batch_idx * truncated_backprop_length
            end_idx = start_idx + truncated_backprop_length * batch_size
        
            
            batchX = xTrain[start_idx:end_idx,:].reshape(batch_size,truncated_backprop_length,num_features)
            batchY = yTrain[start_idx:end_idx].reshape(batch_size,truncated_backprop_length,1)
            
            feed = {batchX_placeholder : batchX, batchY_placeholder : batchY}
            
            # TRAIN
            _loss,_train_step,_prediction = sess.run(
                fetches=[mse_loss,train_step,prediction],
                feed_dict = feed
            )
            
            train_mse_loss_list.append(_loss)
            
            if(batch_idx % 200 == 0):
                print('Step {} - MSE Loss: {:.6f}'.format(batch_idx,_loss))
                
    # TEST 
    for test_idx in range(len(xTest) - truncated_backprop_length):
        
        testBatchX = xTest[test_idx:test_idx+truncated_backprop_length,:].reshape((1,truncated_backprop_length,num_features))        
        testBatchY = yTest[test_idx:test_idx+truncated_backprop_length].reshape((1,truncated_backprop_length,1))

        feed = {batchX_placeholder : testBatchX,
            batchY_placeholder : testBatchY}

        # prediction returns 'truncated_backprop_length' values per window; we want the last one
        m_loss,_last_state,_last_label,test_pred = sess.run([mse_loss,last_state,last_label,prediction],feed_dict=feed)
        # The last test_pred
        test_pred_list.append(test_pred[-1][-1])
        test_mse_loss_list.append(m_loss)
Epoch 0
Step 0 - MSE Loss: 0.001891
Step 200 - MSE Loss: 0.000329
Step 400 - MSE Loss: 0.000452
Epoch 1
Step 0 - MSE Loss: 0.002315
Step 200 - MSE Loss: 0.000233
Step 400 - MSE Loss: 0.003716
Epoch 2
Step 0 - MSE Loss: 0.080398
Step 200 - MSE Loss: 0.000138
Step 400 - MSE Loss: 0.000012
Epoch 3
Step 0 - MSE Loss: 0.003966
Step 200 - MSE Loss: 0.000093
Step 400 - MSE Loss: 0.000306
Epoch 4
Step 0 - MSE Loss: 0.004070
Step 200 - MSE Loss: 0.000059
Step 400 - MSE Loss: 0.000031
Epoch 5
Step 0 - MSE Loss: 0.000036
Step 200 - MSE Loss: 0.000153
Step 400 - MSE Loss: 0.000676
Epoch 6
Step 0 - MSE Loss: 0.005615
Step 200 - MSE Loss: 0.000067
Step 400 - MSE Loss: 0.000657
Epoch 7
Step 0 - MSE Loss: 0.009200
Step 200 - MSE Loss: 0.000057
Step 400 - MSE Loss: 0.000608
Epoch 8
Step 0 - MSE Loss: 0.009192
Step 200 - MSE Loss: 0.000041
Step 400 - MSE Loss: 0.000382
Epoch 9
Step 0 - MSE Loss: 0.006470
Step 200 - MSE Loss: 0.000038
Step 400 - MSE Loss: 0.000492
Epoch 10
Step 0 - MSE Loss: 0.010716
Step 200 - MSE Loss: 0.000025
Step 400 - MSE Loss: 0.000491
Epoch 11
Step 0 - MSE Loss: 0.012164
Step 200 - MSE Loss: 0.000016
Step 400 - MSE Loss: 0.000312
Epoch 12
Step 0 - MSE Loss: 0.008425
Step 200 - MSE Loss: 0.000012
Step 400 - MSE Loss: 0.000304
Epoch 13
Step 0 - MSE Loss: 0.009275
Step 200 - MSE Loss: 0.000010
Step 400 - MSE Loss: 0.000319
Epoch 14
Step 0 - MSE Loss: 0.009884
Step 200 - MSE Loss: 0.000009
Step 400 - MSE Loss: 0.000283
Epoch 15
Step 0 - MSE Loss: 0.008482
Step 200 - MSE Loss: 0.000007
Step 400 - MSE Loss: 0.000254
Epoch 16
Step 0 - MSE Loss: 0.007859
Step 200 - MSE Loss: 0.000007
Step 400 - MSE Loss: 0.000275
Epoch 17
Step 0 - MSE Loss: 0.008467
Step 200 - MSE Loss: 0.000006
Step 400 - MSE Loss: 0.000251
Epoch 18
Step 0 - MSE Loss: 0.007355
Step 200 - MSE Loss: 0.000006
Step 400 - MSE Loss: 0.000263
Epoch 19
Step 0 - MSE Loss: 0.007814
Step 200 - MSE Loss: 0.000005
Step 400 - MSE Loss: 0.000243
Epoch 20
Step 0 - MSE Loss: 0.006765
Step 200 - MSE Loss: 0.000006
Step 400 - MSE Loss: 0.000273
Epoch 21
Step 0 - MSE Loss: 0.007710
Step 200 - MSE Loss: 0.000004
Step 400 - MSE Loss: 0.000221
Epoch 22
Step 0 - MSE Loss: 0.005565
Step 200 - MSE Loss: 0.000006
Step 400 - MSE Loss: 0.000312
Epoch 23
Step 0 - MSE Loss: 0.008246
Step 200 - MSE Loss: 0.000002
Step 400 - MSE Loss: 0.000153
Epoch 24
Step 0 - MSE Loss: 0.002935
Step 200 - MSE Loss: 0.000010
Step 400 - MSE Loss: 0.000493
Epoch 25
Step 0 - MSE Loss: 0.009705
Step 200 - MSE Loss: 0.000001
Step 400 - MSE Loss: 0.000004
Epoch 26
Step 0 - MSE Loss: 0.000014
Step 200 - MSE Loss: 0.000024
Step 400 - MSE Loss: 0.001077
Epoch 27
Step 0 - MSE Loss: 0.012489
Step 200 - MSE Loss: 0.000014
Step 400 - MSE Loss: 0.000109
Epoch 28
Step 0 - MSE Loss: 0.004339
Step 200 - MSE Loss: 0.000021
Step 400 - MSE Loss: 0.000928
Epoch 29
Step 0 - MSE Loss: 0.014907
Step 200 - MSE Loss: 0.000017
Step 400 - MSE Loss: 0.000002
Epoch 30
Step 0 - MSE Loss: 0.002422
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000194
Epoch 31
Step 0 - MSE Loss: 0.000482
Step 200 - MSE Loss: 0.000002
Step 400 - MSE Loss: 0.000020
Epoch 32
Step 0 - MSE Loss: 0.000193
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000170
Epoch 33
Step 0 - MSE Loss: 0.001758
Step 200 - MSE Loss: 0.000002
Step 400 - MSE Loss: 0.000019
Epoch 34
Step 0 - MSE Loss: 0.000054
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000250
Epoch 35
Step 0 - MSE Loss: 0.003371
Step 200 - MSE Loss: 0.000010
Step 400 - MSE Loss: 0.000003
Epoch 36
Step 0 - MSE Loss: 0.000667
Step 200 - MSE Loss: 0.000002
Step 400 - MSE Loss: 0.000323
Epoch 37
Step 0 - MSE Loss: 0.003564
Step 200 - MSE Loss: 0.000016
Step 400 - MSE Loss: 0.000003
Epoch 38
Step 0 - MSE Loss: 0.004219
Step 200 - MSE Loss: 0.000005
Step 400 - MSE Loss: 0.000207
Epoch 39
Step 0 - MSE Loss: 0.000275
Step 200 - MSE Loss: 0.000006
Step 400 - MSE Loss: 0.000001
Epoch 40
Step 0 - MSE Loss: 0.000370
Step 200 - MSE Loss: 0.000001
Step 400 - MSE Loss: 0.000158
Epoch 41
Step 0 - MSE Loss: 0.000700
Step 200 - MSE Loss: 0.000002
Step 400 - MSE Loss: 0.000002
Epoch 42
Step 0 - MSE Loss: 0.000336
Step 200 - MSE Loss: 0.000001
Step 400 - MSE Loss: 0.000147
Epoch 43
Step 0 - MSE Loss: 0.001955
Step 200 - MSE Loss: 0.000004
Step 400 - MSE Loss: 0.000000
Epoch 44
Step 0 - MSE Loss: 0.001047
Step 200 - MSE Loss: 0.000005
Step 400 - MSE Loss: 0.000115
Epoch 45
Step 0 - MSE Loss: 0.002597
Step 200 - MSE Loss: 0.000005
Step 400 - MSE Loss: 0.000001
Epoch 46
Step 0 - MSE Loss: 0.002533
Step 200 - MSE Loss: 0.000002
Step 400 - MSE Loss: 0.000096
Epoch 47
Step 0 - MSE Loss: 0.002108
Step 200 - MSE Loss: 0.000003
Step 400 - MSE Loss: 0.000003
Epoch 48
Step 0 - MSE Loss: 0.000770
Step 200 - MSE Loss: 0.000002
Step 400 - MSE Loss: 0.000034
Epoch 49
Step 0 - MSE Loss: 0.000815
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000022
Epoch 50
Step 0 - MSE Loss: 0.000003
Step 200 - MSE Loss: 0.000001
Step 400 - MSE Loss: 0.000089
Epoch 51
Step 0 - MSE Loss: 0.001969
Step 200 - MSE Loss: 0.000003
Step 400 - MSE Loss: 0.000033
Epoch 52
Step 0 - MSE Loss: 0.000212
Step 200 - MSE Loss: 0.000001
Step 400 - MSE Loss: 0.000064
Epoch 53
Step 0 - MSE Loss: 0.000317
Step 200 - MSE Loss: 0.000002
Step 400 - MSE Loss: 0.000017
Epoch 54
Step 0 - MSE Loss: 0.000757
Step 200 - MSE Loss: 0.000002
Step 400 - MSE Loss: 0.000097
Epoch 55
Step 0 - MSE Loss: 0.001517
Step 200 - MSE Loss: 0.000009
Step 400 - MSE Loss: 0.000000
Epoch 56
Step 0 - MSE Loss: 0.002231
Step 200 - MSE Loss: 0.000011
Step 400 - MSE Loss: 0.000067
Epoch 57
Step 0 - MSE Loss: 0.001620
Step 200 - MSE Loss: 0.000009
Step 400 - MSE Loss: 0.000006
Epoch 58
Step 0 - MSE Loss: 0.002237
Step 200 - MSE Loss: 0.000006
Step 400 - MSE Loss: 0.000116
Epoch 59
Step 0 - MSE Loss: 0.003541
Step 200 - MSE Loss: 0.000007
Step 400 - MSE Loss: 0.000003
Epoch 60
Step 0 - MSE Loss: 0.000421
Step 200 - MSE Loss: 0.000002
Step 400 - MSE Loss: 0.000058
Epoch 61
Step 0 - MSE Loss: 0.000450
Step 200 - MSE Loss: 0.000001
Step 400 - MSE Loss: 0.000015
Epoch 62
Step 0 - MSE Loss: 0.000056
Step 200 - MSE Loss: 0.000001
Step 400 - MSE Loss: 0.000055
Epoch 63
Step 0 - MSE Loss: 0.000437
Step 200 - MSE Loss: 0.000001
Step 400 - MSE Loss: 0.000034
Epoch 64
Step 0 - MSE Loss: 0.000046
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000089
Epoch 65
Step 0 - MSE Loss: 0.000643
Step 200 - MSE Loss: 0.000005
Step 400 - MSE Loss: 0.000024
Epoch 66
Step 0 - MSE Loss: 0.000274
Step 200 - MSE Loss: 0.000001
Step 400 - MSE Loss: 0.000070
Epoch 67
Step 0 - MSE Loss: 0.000235
Step 200 - MSE Loss: 0.000005
Step 400 - MSE Loss: 0.000004
Epoch 68
Step 0 - MSE Loss: 0.001422
Step 200 - MSE Loss: 0.000004
Step 400 - MSE Loss: 0.000090
Epoch 69
Step 0 - MSE Loss: 0.001023
Step 200 - MSE Loss: 0.000003
Step 400 - MSE Loss: 0.000000
Epoch 70
Step 0 - MSE Loss: 0.000619
Step 200 - MSE Loss: 0.000011
Step 400 - MSE Loss: 0.000176
Epoch 71
Step 0 - MSE Loss: 0.002911
Step 200 - MSE Loss: 0.000002
Step 400 - MSE Loss: 0.000008
Epoch 72
Step 0 - MSE Loss: 0.000142
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000076
Epoch 73
Step 0 - MSE Loss: 0.002052
Step 200 - MSE Loss: 0.000001
Step 400 - MSE Loss: 0.000016
Epoch 74
Step 0 - MSE Loss: 0.000242
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000050
Epoch 75
Step 0 - MSE Loss: 0.000839
Step 200 - MSE Loss: 0.000001
Step 400 - MSE Loss: 0.000037
Epoch 76
Step 0 - MSE Loss: 0.000000
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000060
Epoch 77
Step 0 - MSE Loss: 0.000347
Step 200 - MSE Loss: 0.000001
Step 400 - MSE Loss: 0.000039
Epoch 78
Step 0 - MSE Loss: 0.000030
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000077
Epoch 79
Step 0 - MSE Loss: 0.000562
Step 200 - MSE Loss: 0.000003
Step 400 - MSE Loss: 0.000040
Epoch 80
Step 0 - MSE Loss: 0.000123
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000056
Epoch 81
Step 0 - MSE Loss: 0.000591
Step 200 - MSE Loss: 0.000002
Step 400 - MSE Loss: 0.000029
Epoch 82
Step 0 - MSE Loss: 0.000314
Step 200 - MSE Loss: 0.000001
Step 400 - MSE Loss: 0.000044
Epoch 83
Step 0 - MSE Loss: 0.000788
Step 200 - MSE Loss: 0.000002
Step 400 - MSE Loss: 0.000041
Epoch 84
Step 0 - MSE Loss: 0.000107
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000035
Epoch 85
Step 0 - MSE Loss: 0.000353
Step 200 - MSE Loss: 0.000001
Step 400 - MSE Loss: 0.000036
Epoch 86
Step 0 - MSE Loss: 0.000081
Step 200 - MSE Loss: 0.000001
Step 400 - MSE Loss: 0.000054
Epoch 87
Step 0 - MSE Loss: 0.000398
Step 200 - MSE Loss: 0.000001
Step 400 - MSE Loss: 0.000056
Epoch 88
Step 0 - MSE Loss: 0.000004
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000053
Epoch 89
Step 0 - MSE Loss: 0.000347
Step 200 - MSE Loss: 0.000001
Step 400 - MSE Loss: 0.000028
Epoch 90
Step 0 - MSE Loss: 0.000392
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000041
Epoch 91
Step 0 - MSE Loss: 0.000442
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000070
Epoch 92
Step 0 - MSE Loss: 0.000021
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000052
Epoch 93
Step 0 - MSE Loss: 0.000281
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000036
Epoch 94
Step 0 - MSE Loss: 0.000224
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000043
Epoch 95
Step 0 - MSE Loss: 0.000285
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000100
Epoch 96
Step 0 - MSE Loss: 0.000195
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000053
Epoch 97
Step 0 - MSE Loss: 0.000191
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000032
Epoch 98
Step 0 - MSE Loss: 0.000289
Step 200 - MSE Loss: 0.000001
Step 400 - MSE Loss: 0.000054
Epoch 99
Step 0 - MSE Loss: 0.000256
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000122
Epoch 100
Step 0 - MSE Loss: 0.000603
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000062
Epoch 101
Step 0 - MSE Loss: 0.000145
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000015
Epoch 102
Step 0 - MSE Loss: 0.001565
Step 200 - MSE Loss: 0.000001
Step 400 - MSE Loss: 0.000068
Epoch 103
Step 0 - MSE Loss: 0.000758
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000058
Epoch 104
Step 0 - MSE Loss: 0.000092
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000032
Epoch 105
Step 0 - MSE Loss: 0.000029
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000032
Epoch 106
Step 0 - MSE Loss: 0.000052
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000073
Epoch 107
Step 0 - MSE Loss: 0.000224
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000079
Epoch 108
Step 0 - MSE Loss: 0.000303
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000027
Epoch 109
Step 0 - MSE Loss: 0.000059
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000056
Epoch 110
Step 0 - MSE Loss: 0.000259
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000071
Epoch 111
Step 0 - MSE Loss: 0.000181
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000050
Epoch 112
Step 0 - MSE Loss: 0.000184
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000055
Epoch 113
Step 0 - MSE Loss: 0.000035
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000076
Epoch 114
Step 0 - MSE Loss: 0.000162
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000106
Epoch 115
Step 0 - MSE Loss: 0.000374
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000075
Epoch 116
Step 0 - MSE Loss: 0.000347
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000030
Epoch 117
Step 0 - MSE Loss: 0.001239
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000076
Epoch 118
Step 0 - MSE Loss: 0.000906
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000034
Epoch 119
Step 0 - MSE Loss: 0.000108
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000056
Epoch 120
Step 0 - MSE Loss: 0.000588
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000034
Epoch 121
Step 0 - MSE Loss: 0.000070
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000044
Epoch 122
Step 0 - MSE Loss: 0.000295
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000065
Epoch 123
Step 0 - MSE Loss: 0.000197
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000050
Epoch 124
Step 0 - MSE Loss: 0.000109
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000059
Epoch 125
Step 0 - MSE Loss: 0.000163
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000070
Epoch 126
Step 0 - MSE Loss: 0.000223
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000063
Epoch 127
Step 0 - MSE Loss: 0.000118
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000046
Epoch 128
Step 0 - MSE Loss: 0.000146
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000073
Epoch 129
Step 0 - MSE Loss: 0.000208
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000075
Epoch 130
Step 0 - MSE Loss: 0.000256
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000062
Epoch 131
Step 0 - MSE Loss: 0.000168
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000042
Epoch 132
Step 0 - MSE Loss: 0.000006
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000081
Epoch 133
Step 0 - MSE Loss: 0.000266
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000044
Epoch 134
Step 0 - MSE Loss: 0.000068
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000123
Epoch 135
Step 0 - MSE Loss: 0.001057
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000026
Epoch 136
Step 0 - MSE Loss: 0.000008
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000048
Epoch 137
Step 0 - MSE Loss: 0.000201
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000029
Epoch 138
Step 0 - MSE Loss: 0.000005
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000040
Epoch 139
Step 0 - MSE Loss: 0.000277
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000043
Epoch 140
Step 0 - MSE Loss: 0.000066
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000039
Epoch 141
Step 0 - MSE Loss: 0.000072
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000054
Epoch 142
Step 0 - MSE Loss: 0.000209
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000044
Epoch 143
Step 0 - MSE Loss: 0.000015
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000039
Epoch 144
Step 0 - MSE Loss: 0.000153
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000061
Epoch 145
Step 0 - MSE Loss: 0.000201
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000044
Epoch 146
Step 0 - MSE Loss: 0.000020
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000060
Epoch 147
Step 0 - MSE Loss: 0.000180
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000056
Epoch 148
Step 0 - MSE Loss: 0.000199
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000059
Epoch 149
Step 0 - MSE Loss: 0.000223
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000027
Epoch 150
Step 0 - MSE Loss: 0.000095
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000084
Epoch 151
Step 0 - MSE Loss: 0.000472
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000024
Epoch 152
Step 0 - MSE Loss: 0.000043
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000077
Epoch 153
Step 0 - MSE Loss: 0.001115
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000025
Epoch 154
Step 0 - MSE Loss: 0.000208
Step 200 - MSE Loss: 0.000001
Step 400 - MSE Loss: 0.000036
Epoch 155
Step 0 - MSE Loss: 0.000121
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000034
Epoch 156
Step 0 - MSE Loss: 0.000076
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000035
Epoch 157
Step 0 - MSE Loss: 0.000130
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000042
Epoch 158
Step 0 - MSE Loss: 0.000161
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000037
Epoch 159
Step 0 - MSE Loss: 0.000045
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000050
Epoch 160
Step 0 - MSE Loss: 0.000258
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000044
Epoch 161
Step 0 - MSE Loss: 0.000043
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000039
Epoch 162
Step 0 - MSE Loss: 0.000178
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000048
Epoch 163
Step 0 - MSE Loss: 0.000135
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000042
Epoch 164
Step 0 - MSE Loss: 0.000161
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000057
Epoch 165
Step 0 - MSE Loss: 0.000293
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000049
Epoch 166
Step 0 - MSE Loss: 0.000104
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000041
Epoch 167
Step 0 - MSE Loss: 0.000217
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000045
Epoch 168
Step 0 - MSE Loss: 0.000213
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000052
Epoch 169
Step 0 - MSE Loss: 0.000171
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000046
Epoch 170
Step 0 - MSE Loss: 0.000185
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000064
Epoch 171
Step 0 - MSE Loss: 0.000289
Step 200 - MSE Loss: 0.000001
Step 400 - MSE Loss: 0.000039
Epoch 172
Step 0 - MSE Loss: 0.000042
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000084
Epoch 173
Step 0 - MSE Loss: 0.000322
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000021
Epoch 174
Step 0 - MSE Loss: 0.000026
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000078
Epoch 175
Step 0 - MSE Loss: 0.001083
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000017
...
[per-step training log truncated: recorded MSE losses remain between roughly 1e-6 and 1e-3 through the remaining epochs]
...
Epoch 499
Step 0 - MSE Loss: 0.000518
Step 200 - MSE Loss: 0.000000
Step 400 - MSE Loss: 0.000010
In [70]:
# Average of the per-step root errors recorded during training
train_rmse = sum(item**(1/2.0) for item in train_mse_loss_list)/len(train_mse_loss_list)
print("Mean Training Loss (RMSE) is {:.6f}".format(train_rmse))
# Average of the per-step root errors recorded during testing
test_rmse = sum(item**(1/2.0) for item in test_mse_loss_list)/len(test_mse_loss_list)
print("Mean Testing Loss (RMSE) is {:.6f}".format(test_rmse))
Mean Training Loss (RMSE) is 0.004705
Mean Testing Loss (RMSE) is 0.017578
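Note that this averages the square roots of the individual per-step MSE values; a pooled RMSE would instead take the square root of the mean MSE. A minimal sketch of the pooled variant for comparison (not part of the original run):

import math
# Pooled RMSE: square root of the average recorded MSE, rather than the
# average of per-step square roots computed above
pooled_train_rmse = math.sqrt(sum(train_mse_loss_list)/len(train_mse_loss_list))
print("Pooled Training RMSE is {:.6f}".format(pooled_train_rmse))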

Model Evaluation and Performance

A coefficient of variation (CV) can be calculated and interpreted in two different settings: analyzing a single variable and interpreting a model.

In the modeling setting, the CV is calculated as the ratio of the root mean squared error (RMSE) to the mean of the dependent variable. In both settings, the CV is often presented as the given ratio multiplied by 100.

The CV for a model aims to describe the model fit in terms of the relative sizes of the squared residuals and outcome values. The lower the CV, the smaller the residuals relative to the predicted values, which suggests a better model fit.

The advantage of the CV is that it is unitless. This allows CVs to be compared to each other in ways that other measures, like standard deviations or root mean squared residuals, cannot be.

In the modeling setting, the RMSEs of two models each measure the magnitude of their residuals, but when the outcome variables are on different scales they cannot be compared directly to determine which model provides better predictions.

The model RMSE and mean of the predicted variable are expressed in the same units, so taking the ratio of these two allows the units to cancel. This ratio can then be compared to other such ratios in a meaningful way: between two models (where the outcome variable meets the assumptions outlined below), the model with the smaller CV has predicted values that are closer to the actual values.

It is interesting to note the differences between a model’s CV and R-squared values. Both are unitless measures that are indicative of model fit, but they define model fit in two different ways: CV evaluates the relative closeness of the predictions to the actual values while R-squared evaluates how much of the variability in the actual values is explained by the model.
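As a worked example with the numbers above: the mean testing RMSE is 0.017578 and the mean predicted (scaled) value works out to roughly 0.325, so the CV is about 0.017578 / 0.325 ≈ 0.054, i.e. roughly 5.4%, matching the value computed in the next cell.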

In [74]:
# Mean of the test-set predictions (used as the denominator of the CV)
mean_test_pred = sum(test_pred_list)/len(test_pred_list)
# Coefficient of variation: test RMSE relative to the mean predicted value
cv_rnn = test_rmse/mean_test_pred
print("Coefficient of Variation (in percentage) for the RNN model is {:.6f}".format(cv_rnn*100))
Coefficient of Variation (in percentage) for the RNN model is 5.410810
In [50]:
trace = go.Scatter(
    x=np.arange(0, len(train_mse_loss_list)),
    y=train_mse_loss_list,
    mode='markers',
)

layout = go.Layout(
    title="Training Loss",
    xaxis=dict(title='epochs'),
    yaxis=dict(title='train loss'),
)

fig = go.Figure(data=[trace], layout=layout)

py.iplot(fig)
In [51]:
avg_cp_trace = go.Scatter(y=yTest, name = 'Average Close Price (USD)', line = dict(color = ('rgb(205, 12, 24)'), width = 4))
pred_avg_cp_trace = go.Scatter(y=test_pred_list, name = 'Predicted Average Close Price (USD)', line = dict(color = ('rgb(22, 96, 167)'), width = 4))
layout = dict(title = 'Average Close Price (USD) vs Predicted Average Close Price (USD)')
fig = dict(data=[avg_cp_trace, pred_avg_cp_trace], layout=layout)
py.iplot(fig)

Second Framework - LSTM trained on Initial Features (Historical Bitcoin Prices) and Secondary Features (Sentiments of Top Articles for each day from 01/07/2014 to 12/12/2017)

You can get a deeper understanding of how I scraped the articles and computed their sentiments in my notebooks titled Google News Scraper and Sentiment Analysis of Top Google News Articles for the keyword bitcoin.

Input data

In [52]:
import numpy as np
from sklearn.metrics import mean_squared_error
from sklearn.preprocessing import MinMaxScaler
from keras.models import Sequential
from keras.layers import Dense, Activation, Dropout
from keras.layers import LSTM
Using TensorFlow backend.
In [53]:
df_sentiment = pd.read_csv('bitcoin_news_average_sentiments_2014_2017.csv')
df_sentiment = df_sentiment.drop(["Date"], axis=1)
df_sentiment.head()
Out[53]:
Average Sentiment Score
0 -0.070
1 -0.150
2 -0.180
3 0.110
4 -0.025
In [54]:
df_cprice_data = pd.read_csv('BTC_USD.csv', usecols=[0,4])
# Keep Date as a regular column for now; the index is set after the
# concatenation with the sentiment scores below
df_cprice_data.head()
Out[54]:
Date Average Close Price (USD)
0 2014-01-07 795.333333
1 2014-01-08 825.656790
2 2014-01-09 838.876447
3 2014-01-10 848.229587
4 2014-01-11 897.770750
In [55]:
finaldf = pd.concat([df_cprice_data, df_sentiment], axis=1)
finaldf.set_index('Date',inplace=True)
finaldf.head()
Out[55]:
Average Close Price (USD) Average Sentiment Score
Date
2014-01-07 795.333333 -0.070
2014-01-08 825.656790 -0.150
2014-01-09 838.876447 -0.180
2014-01-10 848.229587 0.110
2014-01-11 897.770750 -0.025

Visualizing the feature set values

Variation in Normalized Bitcoin Closing Price with respect to Daily Average Sentiment Score

In [56]:
avg_cp_trace = go.Scatter(y=dfNorm['Average Close Price (USD)'], x=finaldf.index, name = 'Daily Average Close Price (USD)', line = dict(color = ('rgb(205, 12, 24)'), width = 1))
avg_sentiment_trace = go.Scatter(y=finaldf['Average Sentiment Score'], x=finaldf.index, name = 'Daily Average Sentiment Scores', line = dict(color = ('rgb(22, 96, 167)'), width = 0.5))
layout = dict(title = 'Daily Average Close Price (USD) vs Daily Average Sentiment Score')
fig = dict(data=[avg_cp_trace, avg_sentiment_trace], layout=layout)
py.iplot(fig)

Scaling the data

In [57]:
scaler = MinMaxScaler(feature_range=(0, 1))
scaled = scaler.fit_transform(finaldf.values)
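Since predictions from the model will live in this [0, 1] range, the same fitted scaler can later map values back to dollar prices. A minimal sketch of the inverse mapping (illustrative; this cell is not in the original notebook):

# Undo the [0, 1] scaling for rows of (close price, sentiment score)
restored = scaler.inverse_transform(scaled)
# restored[:, 0] recovers the Average Close Price (USD) column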

Convert Time-Series Data to Supervised Series to reframe Final Dataframe

In [58]:
def series_to_supervised(data, n_in=1, n_out=1, dropnan=True):
    """Frame a (scaled) time series as a supervised-learning dataset.

    Each output row pairs n_in lagged observations (t-n_in, ..., t-1)
    with n_out forecast observations (t, ..., t+n_out-1).
    """
    n_vars = 1 if type(data) is list else data.shape[1]
    df = pd.DataFrame(data)
    cols, names = list(), list()
    # input sequence (t-n, ... t-1)
    for i in range(n_in, 0, -1):
        cols.append(df.shift(i))
        names += ['var{0}(t-{1})'.format(j+1, i) for j in range(n_vars)]
    # forecast sequence (t, t+1, ... t+n)
    for i in range(0, n_out):
        cols.append(df.shift(-i))
        if i == 0:
            names += ['var{0}(t)'.format(j+1) for j in range(n_vars)]
        else:
            names += ['var{0}(t+{1})'.format(j+1, i) for j in range(n_vars)]
    # put it all together
    agg = pd.concat(cols, axis=1)
    agg.columns = names
    # drop rows with NaN values introduced by the shifting
    if dropnan:
        agg.dropna(inplace=True)
    return agg
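As a quick illustration (a toy sketch, not part of the original notebook), applying the function to a small two-feature array shows how each row pairs the lagged inputs with the current observation:

# Hypothetical toy series: 5 time steps, 2 features
toy = np.array([[0.1, 0.5],
                [0.2, 0.4],
                [0.3, 0.6],
                [0.4, 0.3],
                [0.5, 0.7]])
# With n_in=1, n_out=1 the columns are var1(t-1), var2(t-1), var1(t), var2(t)
print(series_to_supervised(toy, n_in=1, n_out=1))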
In [59]:
# Look-back period: number of lagged days fed to the LSTM
n_days = 3
# Number of features per day (close price and sentiment score)
n_features = 2
# Width of the flattened input window (lagged columns per sample)
n_obs = n_days*n_features
In [60]:
finaldf_reframed = series_to_supervised(scaled, n_days, 1)
finaldf_reframed
Out[60]:
var1(t-3) var2(t-3) var1(t-2) var2(t-2) var1(t-1) var2(t-1) var1(t) var2(t)
3 0.036456 0.550000 0.038242 0.416667 0.039021 0.366667 0.039572 0.850000
4 0.038242 0.416667 0.039021 0.366667 0.039572 0.850000 0.042491 0.625000
5 0.039021 0.366667 0.039572 0.850000 0.042491 0.625000 0.040504 0.916667
6 0.039572 0.850000 0.042491 0.625000 0.040504 0.916667 0.038231 0.633333
7 0.042491 0.625000 0.040504 0.916667 0.038231 0.633333 0.038181 0.566667
8 0.040504 0.916667 0.038231 0.633333 0.038181 0.566667 0.039301 0.516667
9 0.038231 0.633333 0.038181 0.566667 0.039301 0.516667 0.038015 0.583333
10 0.038181 0.566667 0.039301 0.516667 0.038015 0.583333 0.036538 0.683333
11 0.039301 0.516667 0.038015 0.583333 0.036538 0.683333 0.037310 0.481481
12 0.038015 0.583333 0.036538 0.683333 0.037310 0.481481 0.038833 0.833333
13 0.036538 0.683333 0.037310 0.481481 0.038833 0.833333 0.038560 0.583333
14 0.037310 0.481481 0.038833 0.833333 0.038560 0.583333 0.037986 0.650000
15 0.038833 0.833333 0.038560 0.583333 0.037986 0.650000 0.037852 0.616667
16 0.038560 0.583333 0.037986 0.650000 0.037852 0.616667 0.037891 0.633333
17 0.037986 0.650000 0.037852 0.616667 0.037891 0.633333 0.036041 0.666667
18 0.037852 0.616667 0.037891 0.633333 0.036041 0.666667 0.037297 0.666667
19 0.037891 0.633333 0.036041 0.666667 0.037297 0.666667 0.037735 0.633333
20 0.036041 0.666667 0.037297 0.666667 0.037735 0.633333 0.034615 0.516667
21 0.037297 0.666667 0.037735 0.633333 0.034615 0.516667 0.036711 0.633333
22 0.037735 0.633333 0.034615 0.516667 0.036711 0.633333 0.036519 0.733333
23 0.034615 0.516667 0.036711 0.633333 0.036519 0.733333 0.036965 0.683333
24 0.036711 0.633333 0.036519 0.733333 0.036965 0.683333 0.037029 0.633333
25 0.036519 0.733333 0.036965 0.683333 0.037029 0.633333 0.037769 0.666667
26 0.036965 0.683333 0.037029 0.633333 0.037769 0.666667 0.037610 0.611111
27 0.037029 0.633333 0.037769 0.666667 0.037610 0.611111 0.037483 0.766667
28 0.037769 0.666667 0.037610 0.611111 0.037483 0.766667 0.037092 0.583333
29 0.037610 0.611111 0.037483 0.766667 0.037092 0.583333 0.036256 0.550000
30 0.037483 0.766667 0.037092 0.583333 0.036256 0.550000 0.034829 0.683333
31 0.037092 0.583333 0.036256 0.550000 0.034829 0.683333 0.031608 0.550000
32 0.036256 0.550000 0.034829 0.683333 0.031608 0.550000 0.030246 0.407407
... ... ... ... ... ... ... ... ...
1406 0.377258 0.300000 0.363446 0.450000 0.335612 0.616667 0.374197 0.650000
1407 0.363446 0.450000 0.335612 0.616667 0.374197 0.650000 0.379586 0.766667
1408 0.335612 0.616667 0.374197 0.650000 0.379586 0.766667 0.418789 0.483333
1409 0.374197 0.650000 0.379586 0.766667 0.418789 0.483333 0.450269 0.416667
1410 0.379586 0.766667 0.418789 0.483333 0.450269 0.416667 0.442873 0.566667
1411 0.418789 0.483333 0.450269 0.416667 0.442873 0.566667 0.447855 0.533333
1412 0.450269 0.416667 0.442873 0.566667 0.447855 0.533333 0.462222 0.650000
1413 0.442873 0.566667 0.447855 0.533333 0.462222 0.650000 0.474869 0.600000
1414 0.447855 0.533333 0.462222 0.650000 0.474869 0.600000 0.467025 0.583333
1415 0.462222 0.650000 0.474869 0.600000 0.467025 0.583333 0.474888 0.433333
1416 0.474869 0.600000 0.467025 0.583333 0.474888 0.433333 0.461790 0.666667
1417 0.467025 0.583333 0.474888 0.433333 0.461790 0.666667 0.472577 0.616667
1418 0.474888 0.433333 0.461790 0.666667 0.472577 0.616667 0.504919 0.666667
1419 0.461790 0.666667 0.472577 0.616667 0.504919 0.666667 0.537206 0.650000
1420 0.472577 0.616667 0.504919 0.666667 0.537206 0.650000 0.561229 0.733333
1421 0.504919 0.666667 0.537206 0.650000 0.561229 0.733333 0.572161 0.633333
1422 0.537206 0.650000 0.561229 0.733333 0.572161 0.633333 0.569546 0.583333
1423 0.561229 0.733333 0.572161 0.633333 0.569546 0.583333 0.576979 0.650000
1424 0.572161 0.633333 0.569546 0.583333 0.576979 0.650000 0.630000 0.500000
1425 0.569546 0.583333 0.576979 0.650000 0.630000 0.500000 0.631798 0.518519
1426 0.576979 0.650000 0.630000 0.500000 0.631798 0.518519 0.651636 0.516667
1427 0.630000 0.500000 0.631798 0.518519 0.651636 0.516667 0.674760 0.483333
1428 0.631798 0.518519 0.651636 0.516667 0.674760 0.483333 0.679250 0.425926
1429 0.651636 0.516667 0.674760 0.483333 0.679250 0.425926 0.801173 0.483333
1430 0.674760 0.483333 0.679250 0.425926 0.801173 0.483333 0.971557 0.566667
1431 0.679250 0.425926 0.801173 0.483333 0.971557 0.566667 0.935857 0.425926
1432 0.801173 0.483333 0.971557 0.566667 0.935857 0.425926 0.868335 0.666667
1433 0.971557 0.566667 0.935857 0.425926 0.868335 0.666667 0.875257 0.683333
1434 0.935857 0.425926 0.868335 0.666667 0.875257 0.683333 0.974050 0.574074
1435 0.868335 0.666667 0.875257 0.683333 0.974050 0.574074 1.000000 0.583333

1433 rows × 8 columns

Train and Test Data

In [61]:
values = finaldf_reframed.values
n_train_days = 1300
train = values[:n_train_days, :]
test = values[n_train_days:, :]
train.shape
Out[61]:
(1300, 8)
In [62]:
# split into inputs (all lagged columns) and output; column -n_features is
# var1(t), the current day's scaled close price
train_X, train_y = train[:, :n_obs], train[:, -n_features]
test_X, test_y = test[:, :n_obs], test[:, -n_features]
In [63]:
# reshape input to be 3D [samples, timesteps, features]
train_X = train_X.reshape((train_X.shape[0], n_days, n_features))
test_X = test_X.reshape((test_X.shape[0], n_days, n_features))
print(train_X.shape, train_y.shape, test_X.shape, test_y.shape)
(1300, 3, 2) (1300,) (133, 3, 2) (133,)
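As a quick sanity check (not part of the original run), the reshaped tensors can be verified against the [samples, timesteps, features] convention the LSTM layer expects:

assert train_X.shape == (n_train_days, n_days, n_features)
assert test_X.shape == (len(values) - n_train_days, n_days, n_features)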

LSTM Model

Network Design

In [64]:
model = Sequential()
# Single LSTM layer with 5 units over the (timesteps, features) input window
model.add(LSTM(5, input_shape=(train_X.shape[1], train_X.shape[2])))
model.add(Dropout(0.2))
# One linear dense unit aggregates the LSTM output into a single prediction
model.add(Dense(1))
model.add(Activation('linear'))
model.compile(loss='mae', optimizer='adam')
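Before fitting, Keras can print the architecture for a quick check of layer output shapes and parameter counts (this call is not in the original notebook):

model.summary()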

Fit Model and Train

In [65]:
history = model.fit(train_X, train_y, epochs=500, batch_size=1, validation_data=(test_X, test_y), verbose=2, shuffle=False)
Train on 1300 samples, validate on 133 samples
Epoch 1/500
 - 8s - loss: 0.0059 - val_loss: 0.1860
Epoch 2/500
 - 7s - loss: 0.0059 - val_loss: 0.1821
Epoch 3/500
 - 6s - loss: 0.0054 - val_loss: 0.1904
...
[training log truncated: training loss stays between roughly 0.003 and 0.006 while validation loss drifts between roughly 0.02 and 0.27 over the intervening epochs]
...
Epoch 362/500
 - 6s - loss: 0.0036 - val_loss: 0.0811
Epoch 363/500
 - 6s - loss: 0.0033 - val_loss: 0.0799
Epoch 364/500
 - 7s - loss: 0.0032 - val_loss: 0.0802
Epoch 365/500
 - 6s - loss: 0.0036 - val_loss: 0.0807
Epoch 366/500
 - 6s - loss: 0.0034 - val_loss: 0.0794
Epoch 367/500
 - 6s - loss: 0.0036 - val_loss: 0.0859
Epoch 368/500
 - 6s - loss: 0.0033 - val_loss: 0.0813
Epoch 369/500
 - 6s - loss: 0.0033 - val_loss: 0.0765
Epoch 370/500
 - 6s - loss: 0.0036 - val_loss: 0.0814
Epoch 371/500
 - 6s - loss: 0.0034 - val_loss: 0.0731
Epoch 372/500
 - 6s - loss: 0.0036 - val_loss: 0.0711
Epoch 373/500
 - 8s - loss: 0.0034 - val_loss: 0.0669
Epoch 374/500
 - 8s - loss: 0.0036 - val_loss: 0.0720
Epoch 375/500
 - 7s - loss: 0.0034 - val_loss: 0.0743
Epoch 376/500
 - 6s - loss: 0.0036 - val_loss: 0.0851
Epoch 377/500
 - 6s - loss: 0.0035 - val_loss: 0.0728
Epoch 378/500
 - 6s - loss: 0.0036 - val_loss: 0.0647
Epoch 379/500
 - 6s - loss: 0.0037 - val_loss: 0.0682
Epoch 380/500
 - 7s - loss: 0.0035 - val_loss: 0.0668
Epoch 381/500
 - 7s - loss: 0.0032 - val_loss: 0.0652
Epoch 382/500
 - 6s - loss: 0.0035 - val_loss: 0.0729
Epoch 383/500
 - 6s - loss: 0.0034 - val_loss: 0.0615
Epoch 384/500
 - 6s - loss: 0.0036 - val_loss: 0.0692
Epoch 385/500
 - 6s - loss: 0.0033 - val_loss: 0.0699
Epoch 386/500
 - 6s - loss: 0.0035 - val_loss: 0.0681
Epoch 387/500
 - 6s - loss: 0.0030 - val_loss: 0.0620
Epoch 388/500
 - 6s - loss: 0.0033 - val_loss: 0.0638
Epoch 389/500
 - 6s - loss: 0.0034 - val_loss: 0.0578
Epoch 390/500
 - 6s - loss: 0.0035 - val_loss: 0.0610
Epoch 391/500
 - 6s - loss: 0.0032 - val_loss: 0.0494
Epoch 392/500
 - 6s - loss: 0.0035 - val_loss: 0.0525
Epoch 393/500
 - 6s - loss: 0.0032 - val_loss: 0.0610
Epoch 394/500
 - 6s - loss: 0.0033 - val_loss: 0.0640
Epoch 395/500
 - 6s - loss: 0.0037 - val_loss: 0.0569
Epoch 396/500
 - 6s - loss: 0.0033 - val_loss: 0.0619
Epoch 397/500
 - 6s - loss: 0.0037 - val_loss: 0.0505
Epoch 398/500
 - 6s - loss: 0.0035 - val_loss: 0.0569
Epoch 399/500
 - 6s - loss: 0.0032 - val_loss: 0.0526
Epoch 400/500
 - 6s - loss: 0.0032 - val_loss: 0.0548
Epoch 401/500
 - 6s - loss: 0.0033 - val_loss: 0.0556
Epoch 402/500
 - 6s - loss: 0.0032 - val_loss: 0.0567
Epoch 403/500
 - 6s - loss: 0.0033 - val_loss: 0.0450
Epoch 404/500
 - 6s - loss: 0.0036 - val_loss: 0.0485
Epoch 405/500
 - 6s - loss: 0.0034 - val_loss: 0.0627
Epoch 406/500
 - 7s - loss: 0.0033 - val_loss: 0.0508
Epoch 407/500
 - 6s - loss: 0.0035 - val_loss: 0.0588
Epoch 408/500
 - 6s - loss: 0.0030 - val_loss: 0.0619
Epoch 409/500
 - 6s - loss: 0.0033 - val_loss: 0.0690
Epoch 410/500
 - 6s - loss: 0.0034 - val_loss: 0.0730
Epoch 411/500
 - 6s - loss: 0.0035 - val_loss: 0.0765
Epoch 412/500
 - 6s - loss: 0.0034 - val_loss: 0.0794
Epoch 413/500
 - 6s - loss: 0.0035 - val_loss: 0.0725
Epoch 414/500
 - 6s - loss: 0.0035 - val_loss: 0.0656
Epoch 415/500
 - 6s - loss: 0.0035 - val_loss: 0.0703
Epoch 416/500
 - 6s - loss: 0.0033 - val_loss: 0.0606
Epoch 417/500
 - 6s - loss: 0.0035 - val_loss: 0.0598
Epoch 418/500
 - 6s - loss: 0.0036 - val_loss: 0.0596
Epoch 419/500
 - 6s - loss: 0.0033 - val_loss: 0.0519
Epoch 420/500
 - 6s - loss: 0.0033 - val_loss: 0.0571
Epoch 421/500
 - 6s - loss: 0.0034 - val_loss: 0.0592
Epoch 422/500
 - 6s - loss: 0.0034 - val_loss: 0.0567
Epoch 423/500
 - 6s - loss: 0.0035 - val_loss: 0.0591
Epoch 424/500
 - 6s - loss: 0.0036 - val_loss: 0.0563
Epoch 425/500
 - 6s - loss: 0.0033 - val_loss: 0.0507
Epoch 426/500
 - 6s - loss: 0.0033 - val_loss: 0.0496
Epoch 427/500
 - 6s - loss: 0.0037 - val_loss: 0.0554
Epoch 428/500
 - 6s - loss: 0.0030 - val_loss: 0.0657
Epoch 429/500
 - 6s - loss: 0.0032 - val_loss: 0.0543
Epoch 430/500
 - 7s - loss: 0.0036 - val_loss: 0.0512
Epoch 431/500
 - 6s - loss: 0.0032 - val_loss: 0.0494
Epoch 432/500
 - 6s - loss: 0.0032 - val_loss: 0.0548
Epoch 433/500
 - 6s - loss: 0.0033 - val_loss: 0.0545
Epoch 434/500
 - 6s - loss: 0.0032 - val_loss: 0.0516
Epoch 435/500
 - 6s - loss: 0.0032 - val_loss: 0.0481
Epoch 436/500
 - 6s - loss: 0.0034 - val_loss: 0.0461
Epoch 437/500
 - 6s - loss: 0.0036 - val_loss: 0.0410
Epoch 438/500
 - 7s - loss: 0.0033 - val_loss: 0.0443
Epoch 439/500
 - 6s - loss: 0.0031 - val_loss: 0.0484
Epoch 440/500
 - 6s - loss: 0.0036 - val_loss: 0.0473
Epoch 441/500
 - 6s - loss: 0.0035 - val_loss: 0.0442
Epoch 442/500
 - 6s - loss: 0.0033 - val_loss: 0.0383
Epoch 443/500
 - 6s - loss: 0.0038 - val_loss: 0.0561
Epoch 444/500
 - 6s - loss: 0.0030 - val_loss: 0.0426
Epoch 445/500
 - 6s - loss: 0.0033 - val_loss: 0.0501
Epoch 446/500
 - 7s - loss: 0.0033 - val_loss: 0.0546
Epoch 447/500
 - 8s - loss: 0.0037 - val_loss: 0.0542
Epoch 448/500
 - 6s - loss: 0.0034 - val_loss: 0.0582
Epoch 449/500
 - 7s - loss: 0.0036 - val_loss: 0.0526
Epoch 450/500
 - 7s - loss: 0.0033 - val_loss: 0.0645
Epoch 451/500
 - 7s - loss: 0.0032 - val_loss: 0.0520
Epoch 452/500
 - 7s - loss: 0.0036 - val_loss: 0.0479
Epoch 453/500
 - 7s - loss: 0.0032 - val_loss: 0.0498
Epoch 454/500
 - 7s - loss: 0.0033 - val_loss: 0.0611
Epoch 455/500
 - 7s - loss: 0.0031 - val_loss: 0.0590
Epoch 456/500
 - 6s - loss: 0.0033 - val_loss: 0.0573
Epoch 457/500
 - 6s - loss: 0.0034 - val_loss: 0.0463
Epoch 458/500
 - 7s - loss: 0.0033 - val_loss: 0.0525
Epoch 459/500
 - 6s - loss: 0.0034 - val_loss: 0.0500
Epoch 460/500
 - 6s - loss: 0.0035 - val_loss: 0.0473
Epoch 461/500
 - 8s - loss: 0.0032 - val_loss: 0.0508
Epoch 462/500
 - 9s - loss: 0.0033 - val_loss: 0.0439
Epoch 463/500
 - 7s - loss: 0.0037 - val_loss: 0.0623
Epoch 464/500
 - 7s - loss: 0.0034 - val_loss: 0.0527
Epoch 465/500
 - 7s - loss: 0.0035 - val_loss: 0.0563
Epoch 466/500
 - 7s - loss: 0.0033 - val_loss: 0.0518
Epoch 467/500
 - 7s - loss: 0.0032 - val_loss: 0.0497
Epoch 468/500
 - 6s - loss: 0.0033 - val_loss: 0.0543
Epoch 469/500
 - 6s - loss: 0.0033 - val_loss: 0.0554
Epoch 470/500
 - 7s - loss: 0.0035 - val_loss: 0.0587
Epoch 471/500
 - 7s - loss: 0.0032 - val_loss: 0.0513
Epoch 472/500
 - 7s - loss: 0.0032 - val_loss: 0.0531
Epoch 473/500
 - 6s - loss: 0.0034 - val_loss: 0.0547
Epoch 474/500
 - 7s - loss: 0.0033 - val_loss: 0.0607
Epoch 475/500
 - 7s - loss: 0.0031 - val_loss: 0.0598
Epoch 476/500
 - 8s - loss: 0.0032 - val_loss: 0.0616
Epoch 477/500
 - 7s - loss: 0.0033 - val_loss: 0.0686
Epoch 478/500
 - 8s - loss: 0.0032 - val_loss: 0.0532
Epoch 479/500
 - 7s - loss: 0.0033 - val_loss: 0.0579
Epoch 480/500
 - 6s - loss: 0.0033 - val_loss: 0.0654
Epoch 481/500
 - 7s - loss: 0.0032 - val_loss: 0.0522
Epoch 482/500
 - 7s - loss: 0.0034 - val_loss: 0.0603
Epoch 483/500
 - 6s - loss: 0.0032 - val_loss: 0.0732
Epoch 484/500
 - 8s - loss: 0.0034 - val_loss: 0.0612
Epoch 485/500
 - 7s - loss: 0.0034 - val_loss: 0.0580
Epoch 486/500
 - 7s - loss: 0.0035 - val_loss: 0.0611
Epoch 487/500
 - 6s - loss: 0.0032 - val_loss: 0.0604
Epoch 488/500
 - 6s - loss: 0.0030 - val_loss: 0.0445
Epoch 489/500
 - 6s - loss: 0.0033 - val_loss: 0.0413
Epoch 490/500
 - 7s - loss: 0.0036 - val_loss: 0.0578
Epoch 491/500
 - 7s - loss: 0.0032 - val_loss: 0.0535
Epoch 492/500
 - 7s - loss: 0.0032 - val_loss: 0.0603
Epoch 493/500
 - 7s - loss: 0.0033 - val_loss: 0.0499
Epoch 494/500
 - 6s - loss: 0.0036 - val_loss: 0.0662
Epoch 495/500
 - 7s - loss: 0.0033 - val_loss: 0.0524
Epoch 496/500
 - 6s - loss: 0.0033 - val_loss: 0.0583
Epoch 497/500
 - 6s - loss: 0.0034 - val_loss: 0.0534
Epoch 498/500
 - 6s - loss: 0.0032 - val_loss: 0.0474
Epoch 499/500
 - 6s - loss: 0.0032 - val_loss: 0.0504
Epoch 500/500
 - 6s - loss: 0.0030 - val_loss: 0.0488
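
The validation loss bottoms out well before epoch 500 (around epoch 442 in the run above) and then drifts upward, a classic sign of overfitting. Below is a minimal sketch of how an early-stopping callback could halt training near the best epoch; it assumes the model, train_X, and train_y variables defined in the training cell earlier in the notebook, and restore_best_weights requires Keras >= 2.2.3.

from keras.callbacks import EarlyStopping

# Stop once val_loss has not improved for 20 consecutive epochs,
# and roll the model back to the best weights seen so far.
early_stop = EarlyStopping(monitor='val_loss', patience=20,
                           restore_best_weights=True, verbose=1)

# Same fit call as above (epochs and verbose mirror the log output;
# the other arguments are assumptions), now with the callback attached.
history = model.fit(train_X, train_y, epochs=500,
                    validation_data=(test_X, test_y),
                    verbose=2, shuffle=False,
                    callbacks=[early_stop])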
In [66]:
# plot the training and testing loss curves recorded during fitting
trloss_trace = go.Scatter(y=history.history['loss'], name='Training Loss',
                          line=dict(color='rgb(205, 12, 24)', width=1))
teloss_trace = go.Scatter(y=history.history['val_loss'], name='Testing Loss',
                          line=dict(color='rgb(22, 96, 167)', width=0.5))
layout = dict(title='Training Loss vs Testing Loss')
fig = dict(data=[trloss_trace, teloss_trace], layout=layout)
py.iplot(fig)

Test Model

In [67]:
# make a prediction on the held-out test window
ypred = model.predict(test_X)
# flatten the 3-D (samples, days, features) input back to 2-D
test_X = test_X.reshape((test_X.shape[0], n_days * n_features))

# invert scaling for the forecast: the scaler expects the same number
# of columns it was fit on, so pad the prediction with a feature
# column before inverse-transforming, then keep only the price column
inv_ypred = np.concatenate((ypred, test_X[:, -1:]), axis=1)
inv_ypred = scaler.inverse_transform(inv_ypred)
inv_ypred = inv_ypred[:, 0]

# invert scaling for the actual values in the same way
test_y = test_y.reshape((len(test_y), 1))
inv_y = np.concatenate((test_y, test_X[:, -1:]), axis=1)
inv_y = scaler.inverse_transform(inv_y)
inv_y = inv_y[:, 0]

# calculate RMSE in the original units (USD)
rmse = np.sqrt(mean_squared_error(inv_y, inv_ypred))
print('Test RMSE: {:.3f}'.format(rmse))
Test RMSE: 1184.328
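
The column padding above is needed only because a single MinMaxScaler was fit jointly on all input columns, so inverse_transform expects that same width. A common alternative, shown here as a self-contained sketch with toy numbers rather than the notebook's data, is to fit a dedicated scaler on the target series so predictions invert directly:

import numpy as np
from sklearn.preprocessing import MinMaxScaler

# Toy close-price series standing in for the raw target column.
y_raw = np.array([[3500.0], [4200.0], [5100.0], [6000.0]])

y_scaler = MinMaxScaler(feature_range=(0, 1))
y_scaled = y_scaler.fit_transform(y_raw)

# A prediction made in scaled space inverts directly, with no need
# to pad it back to the full feature width first.
pred_scaled = np.array([[0.42]])
print(y_scaler.inverse_transform(pred_scaled))  # [[4550.]]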

Model Evaluation and Performance

In [78]:
# mean of the inverse-scaled test predictions
mean_test_pred = np.mean(inv_ypred)
# coefficient of variation: RMSE normalized by the mean prediction
cv_lstm = rmse / mean_test_pred
print("Coefficient of Variation (in percentage) for the LSTM model is {:.6f}".format(cv_lstm * 100))
Coefficient of Variation (in percentage) for the LSTM model is 22.168653
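
As a sanity check, the coefficient of variation above implies a mean predicted close price of roughly 1184.328 / 0.2217 ≈ 5,343 USD. A complementary scale-free error measure, not computed in the original run, is the mean absolute percentage error; a short sketch using the inv_y and inv_ypred arrays from the test cell:

# Mean absolute percentage error between actual and predicted prices;
# the division is safe here because Bitcoin close prices are strictly positive.
mape = np.mean(np.abs((inv_y - inv_ypred) / inv_y)) * 100
print("Test MAPE: {:.2f}%".format(mape))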
In [68]:
# plot actual vs predicted average close prices on the test window
avg_cp_trace = go.Scatter(y=inv_y, name='Average Close Price (USD)',
                          line=dict(color='rgb(205, 12, 24)', width=4))
pred_avg_cp_trace = go.Scatter(y=inv_ypred, name='Predicted Average Close Price (USD)',
                               line=dict(color='rgb(22, 96, 167)', width=4))
layout = dict(title='Average Close Price (USD) vs Predicted Average Close Price (USD)')
fig = dict(data=[avg_cp_trace, pred_avg_cp_trace], layout=layout)
py.iplot(fig)

Conclusion

An evaluation of both approaches suggests that the first framework performs better. This is likely because it draws on a wider range of objective features (technical stock market indicators), whereas the second framework relies on a single, noisier, and potentially biased signal: the sentiment of news articles.

Although LSTM is an improvement over the traditional RNN, in my second framework it performs worse, owing to a shortage of reliable features in both quality and quantity.

In the future I hope to extend this work with more relevant features and stronger hyperparameter optimization techniques (see the sketch below), and to base the sentiment analysis on opinions drawn from a wider audience, in pursuit of better prediction results.
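
As one concrete step in that direction, even a simple random search over a couple of hyperparameters improves on hand tuning. The sketch below assumes the n_days, n_features, train_X, train_y, test_X, and test_y variables from the cells above; the layer sizes, dropout rates, loss, and epoch budget are illustrative, and in practice a held-out validation split (rather than the test set) should score the candidates.

import random
from keras.models import Sequential
from keras.layers import LSTM, Dense, Dropout

def build_model(units, dropout):
    # Same single-LSTM-plus-Dense shape as the second framework.
    model = Sequential()
    model.add(LSTM(units, input_shape=(n_days, n_features)))
    model.add(Dropout(dropout))
    model.add(Dense(1))
    model.compile(loss='mae', optimizer='adam')
    return model

best = None
for _ in range(10):  # ten random draws, purely illustrative
    units = random.choice([32, 64, 128])
    dropout = random.choice([0.0, 0.2, 0.4])
    m = build_model(units, dropout)
    h = m.fit(train_X, train_y, epochs=50, verbose=0,
              validation_data=(test_X, test_y), shuffle=False)
    score = min(h.history['val_loss'])
    if best is None or score < best[0]:
        best = (score, units, dropout)

print("Best (val_loss, units, dropout):", best)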

References

[1] Reid F, Harrigan M. An analysis of anonymity in the Bitcoin system. Springer; 2013.
[2] Böhme R, Christin N, Edelman B, Moore T. Bitcoin: Economics, technology, and governance. The Journal of Economic Perspectives. 2015;29(2):213–38.
[3] Nakamoto S. Bitcoin: A peer-to-peer electronic cash system. 2008.
[4] Kondor D, Pósfai M, Csabai I, Vattay G. Do the rich get richer? An empirical analysis of the Bitcoin transaction network. PLoS ONE. 2014;9(2):e86197. doi: 10.1371/journal.pone.0086197
[5] Ron D, Shamir A. Quantitative analysis of the full Bitcoin transaction graph. In: Financial Cryptography and Data Security. Springer; 2013. p. 6–24.
[6] Garcia D, Tessone CJ, Mavrodiev P, Perony N. The digital traces of bubbles: feedback cycles between socio-economic signals in the Bitcoin economy. Journal of the Royal Society Interface. 2014;11(99):20140623.
[7] Kondor D, Csabai I, Szüle J, Pósfai M, Vattay G. Inferring the interplay between network structure and market effects in Bitcoin. New Journal of Physics. 2014;16(12):125003.
[8] Kristoufek L. BitCoin meets Google Trends and Wikipedia: Quantifying the relationship between phenomena of the Internet era. Scientific Reports. 2013;3.
[9] Kristoufek L. What are the main drivers of the Bitcoin price? Evidence from wavelet coherence analysis. PLoS ONE. 2015;10(4):e0123923. doi: 10.1371/journal.pone.0123923
[10] Yelowitz A, Wilson M. Characteristics of Bitcoin users: an analysis of Google search data. Applied Economics Letters. 2015;22(13):1030–6.